Laplacian eigenmap


Total Variation Classes Beyond 1d: Minimax Rates, and the Limitations of Linear Smoothers

Neural Information Processing Systems

We consider the problem of estimating a function defined over $n$ locations on a $d$-dimensional grid (having all side lengths equal to $n^{1/d}$). When the function is constrained to have discrete total variation bounded by $C_n$, we derive the minimax optimal (squared) $\ell_2$ estimation error rate, parametrized by $n, C_n$. Total variation denoising, also known as the fused lasso, is seen to be rate optimal. Several simpler estimators exist, such as Laplacian smoothing and Laplacian eigenmaps. A natural question is: can these simpler estimators perform just as well?
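A minimal sketch, not from the paper, of the objects the abstract describes, assuming NumPy: it builds the edge incidence matrix D of a 2-d grid graph, evaluates the discrete total variation ||D theta||_1 of a piecewise-constant signal, and runs two of the simpler linear smoothers mentioned, Laplacian smoothing (I + lam L)^{-1} y and a Laplacian eigenmaps projection onto the lowest-frequency eigenvectors. The grid size, noise level, lam, and m are illustrative choices, not values from the paper.

import numpy as np

def grid_incidence(k):
    # Edge incidence matrix D of a k x k grid graph: one row per edge, +1/-1 entries.
    n = k * k
    rows = []
    for i in range(k):
        for j in range(k):
            v = i * k + j
            if j + 1 < k:                      # horizontal neighbor
                r = np.zeros(n); r[v], r[v + 1] = 1.0, -1.0; rows.append(r)
            if i + 1 < k:                      # vertical neighbor
                r = np.zeros(n); r[v], r[v + k] = 1.0, -1.0; rows.append(r)
    return np.array(rows)

k = 20
D = grid_incidence(k)
L = D.T @ D                                    # grid-graph Laplacian, L = D^T D

rng = np.random.default_rng(0)
theta = np.zeros((k, k))
theta[: k // 2, : k // 2] = 1.0                # piecewise-constant truth with bounded TV
theta = theta.ravel()
y = theta + 0.5 * rng.standard_normal(k * k)   # noisy observations

print("discrete TV of truth:", np.abs(D @ theta).sum())

lam = 5.0                                      # illustrative tuning parameter
theta_ls = np.linalg.solve(np.eye(k * k) + lam * L, y)   # Laplacian smoothing

m = 50                                         # illustrative number of eigenvectors
evals, evecs = np.linalg.eigh(L)               # eigenvalues in ascending order
theta_le = evecs[:, :m] @ (evecs[:, :m].T @ y) # Laplacian eigenmaps projection

for name, est in [("Laplacian smoothing", theta_ls), ("Laplacian eigenmaps", theta_le)]:
    print(name, "MSE:", round(float(np.mean((est - theta) ** 2)), 4))

The fused lasso / TV denoising estimator itself, which the paper shows to be rate optimal, would instead minimize ||y - theta||_2^2 + lam * ||D theta||_1 and is not implemented in this sketch.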





A Proofs: A.1 Proof of Theorem 1

Neural Information Processing Systems

B.15 E-Prior. In this case the prior reads: … A.3 Proof of Theorem 2. We consider the following hierarchical model, for … The above problem is non-convex because of the rank constraint (17). We focus on finding the optimal eigenvectors first. Note that this solution is not unique if there are repeated eigenvalues. A.4 Proof of Corollary 1. With the presented hierarchical model (Figure 1), the coupling problem is the following: min … Note that the solution does not depend on Á. Figure 3: Graphical representation of the hierarchical model considered in Section 4.2.





On Differentially Private Graph Sparsification and Applications

Raman Arora, Jalaj Upadhyay

Neural Information Processing Systems

In this paper, we study private sparsification of graphs. In particular, we give an algorithm that, given an input graph, returns a sparse graph that approximates the spectrum of the input graph while ensuring differential privacy.
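For context, below is a minimal, non-private sketch of classical effective-resistance (Spielman-Srivastava style) spectral sparsification, the non-private problem that the paper's algorithm privatizes; it is not the paper's differentially private mechanism. The demo graph, sample count, and the dense pseudoinverse used to compute effective resistances are illustrative assumptions suitable only for tiny graphs.

import numpy as np

def laplacian(n, edges, weights):
    # Weighted graph Laplacian built from an edge list.
    L = np.zeros((n, n))
    for (u, v), w in zip(edges, weights):
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    return L

def sparsify(n, edges, weights, num_samples, rng):
    # Sample edges with probability proportional to weight * effective resistance,
    # reweighting kept edges so the sparse Laplacian is unbiased in expectation.
    Lpinv = np.linalg.pinv(laplacian(n, edges, weights))
    reff = np.array([Lpinv[u, u] + Lpinv[v, v] - 2 * Lpinv[u, v] for u, v in edges])
    p = weights * reff
    p = p / p.sum()
    new_w = np.zeros(len(edges))
    for idx in rng.choice(len(edges), size=num_samples, p=p):
        new_w[idx] += weights[idx] / (num_samples * p[idx])
    kept = np.flatnonzero(new_w)
    return [edges[i] for i in kept], new_w[kept]

rng = np.random.default_rng(1)
n = 30
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.5]
weights = np.ones(len(edges))
s_edges, s_weights = sparsify(n, edges, weights, num_samples=4 * n, rng=rng)

# Compare Laplacian spectra of the dense input graph and its sparsifier.
ev_full = np.linalg.eigvalsh(laplacian(n, edges, weights))
ev_sparse = np.linalg.eigvalsh(laplacian(n, s_edges, s_weights))
print(len(s_edges), "of", len(edges), "edges kept")
print("largest Laplacian eigenvalue, full vs. sparse:",
      round(float(ev_full[-1]), 2), round(float(ev_sparse[-1]), 2))

A differentially private version would additionally randomize the released edge set and weights so that the output distribution changes little when one edge of the input changes; that step is specific to the paper and is deliberately not reproduced here.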